Note: Clicking a Digital Object Identifier (DOI) number takes you to an external site maintained by the publisher. Some full-text articles may not yet be available without charge during the embargo (the administrative interval before free release).
Some links on this page may take you to non-federal websites, whose policies may differ from this site's.
- Free, publicly-accessible full text available June 10, 2026
- Free, publicly-accessible full text available June 10, 2026
- This study investigates small-group collaborative learning in a technology-supported environment. We aim to reveal key aspects of collaborative learning by examining variations in interaction, the influence of small-group collaboration on science knowledge integration, and the implications for individual knowledge mastery. Results underscore the importance of high-quality science discourse and user-friendly tools. The study also highlights that group-level negotiations may not always affect individual understanding. Overall, this research offers insights into the complexities of collaboration and its impact on science learning.
- This study explored the Idea Wall, a collaborative knowledge-building tool that supports students' small-group collaboration during a plant biology science curriculum. We examined the affordances and challenges of the Idea Wall and found that students used the tool's spatial organization capabilities effectively, particularly the Yup Zone and the intermediary neutral spaces, for collaboratively organizing notes. However, some features of the tool's design and the accompanying instructional guidance need improvement.
- Abstract: As use of artificial intelligence (AI) has increased, concerns about AI bias and discrimination have been growing. This paper discusses an application called PyrEval, in which natural language processing (NLP) was used to automate assessment and provide feedback on middle school science writing without linguistic discrimination. Linguistic discrimination in this study was operationalized as unfair assessment of scientific essays based on writing features that are not considered normative, such as subject-verb disagreement. Such unfair assessment is especially problematic when the purpose of assessment is not assessing English writing but rather assessing the content of scientific explanations. PyrEval was implemented in middle school science classrooms. Students explained their roller coaster designs by stating relationships among science concepts such as potential energy, kinetic energy, and the law of conservation of energy. Initial and revised versions of scientific essays written by 307 eighth-grade students were analyzed. Our comparison of manual and NLP assessments showed that PyrEval did not penalize student essays that contained non-normative writing features. Repeated-measures ANOVA and GLMM analyses revealed that essay quality significantly improved from initial to revised essays after students received the NLP feedback, regardless of non-normative writing features. Findings and implications are discussed.

  Practitioner notes

  What is already known about this topic:
  - Advancement in AI has created a variety of opportunities in education, including automated assessment, but AI is not bias-free.
  - Automated writing assessment designed to improve students' scientific explanations has been studied.
  - While limited, some studies reported biased performance of automated writing assessment tools, but without looking into the actual linguistic features against which the tools may have discriminated.

  What this paper adds:
  - This study examined non-normative linguistic features in essays written by middle school students to uncover how our NLP tool, PyrEval, assessed them.
  - PyrEval did not penalize essays containing non-normative linguistic features.
  - Regardless of non-normative linguistic features, students' essay quality scores significantly improved from initial to revised essays after receiving feedback from PyrEval. Essay quality improvement was observed regardless of students' prior knowledge, school district, and teacher variables.

  Implications for practice and/or policy:
  - This paper inspires practitioners to attend to linguistic discrimination (re)produced by AI.
  - This paper offers possibilities of using PyrEval as a reflection tool, with which human assessors compare their assessments and discover implicit bias against non-normative linguistic features.
  - PyrEval is available for use at github.com/psunlpgroup/PyrEvalv2.
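The abstract above describes scoring essays on science content while ignoring non-normative surface features. As a minimal illustrative sketch only (not PyrEval's actual algorithm; the concept list and substring-matching rule are assumptions for illustration), content-focused scoring can be contrasted with grammar-sensitive scoring like this:

```python
# Hypothetical sketch of content-based essay scoring. Assumptions:
# EXPECTED_IDEAS and the matching rule are illustrative, not PyrEval's method.

EXPECTED_IDEAS = {
    "potential energy": ["potential energy"],
    "kinetic energy": ["kinetic energy"],
    "energy conservation": ["conservation of energy", "energy is conserved"],
}

def content_score(essay: str) -> int:
    """Count how many expected science ideas the essay mentions.

    Matching is case-insensitive phrase matching, so non-normative surface
    features (e.g. subject-verb disagreement) cannot lower the score.
    """
    text = essay.lower()
    return sum(
        1
        for phrases in EXPECTED_IDEAS.values()
        if any(phrase in text for phrase in phrases)
    )

# Non-normative grammar ("the car have", "which turn") does not affect the score.
essay = ("At the top the car have high potential energy, "
         "which turn into kinetic energy as it go down.")
print(content_score(essay))  # 2 (of 3 ideas covered)
```

The point of the sketch is the design choice: because the scorer consults only the presence of target ideas, two essays with identical content but different grammatical surface forms receive the same score.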
- Writing and revising scientific explanations helps students integrate disparate scientific ideas into a cohesive understanding of science. Natural language processing technologies can help assess students' writing and give corresponding feedback, which supports the writing and revision of their scientific ideas. However, the feedback is not always helpful to students. Our study investigated 241 middle school students' (a) use of feedback, (b) how it affected their revisions, and (c) how these factors affected their writing improvement. We found that students made more content-related revisions when they used feedback, and that making content-related revisions helped students improve their writing. However, students still found it difficult to make integrated revisions and did not use feedback often. Additional support is needed to help students, especially those with limited science knowledge, understand and use feedback.
- Examining the effect of automated assessments and feedback on students' written science explanations

  Writing scientific explanations is a core practice in science. However, students find it difficult to write coherent scientific explanations, and teachers find it challenging to provide real-time feedback on students' essays. In this study, we discuss how PyrEval, an NLP technology, was used to automatically assess students' essays and provide feedback. We found that students explained more key ideas in their essays after the automated assessment and feedback. However, there were issues with the automated assessments, as well as with students' understanding of the feedback and revising of their essays.
An official website of the United States government